A Crucial Particle Physics Computer Program Risks Obsolescence
Recently, I watched a fellow particle physicist talk about a calculation he had pushed to a new height of precision. His tool: a 1980s-era computer program called FORM. Particle physicists use some of the longest equations in all of science. To look for signs of new elementary particles in collisions at the Large Hadron Collider, for example, they draw thousands of pictures called Feynman diagrams that depict possible collision outcomes, each one encoding a complicated formula that can be millions of terms long. Summing formulas like these with pen and paper is impossible; even adding them with computers is a challenge.
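FORM is a dedicated symbolic-manipulation language, but the core operation described here, collecting and cancelling like terms across enormous sums, can be sketched in a few lines. Below is a toy Python stand-in; the term strings and coefficients are illustrative and are not FORM syntax:

```python
from collections import Counter

def add_expressions(*exprs):
    """Sum symbolic expressions represented as {term: coefficient} maps,
    combining like terms and dropping any that cancel -- a toy version of
    what a computer-algebra system like FORM does at vastly larger scale."""
    total = Counter()
    for expr in exprs:
        total.update(expr)  # Counter.update adds coefficients termwise
    return {term: c for term, c in total.items() if c != 0}

# Two small "amplitudes"; terms are strings like "x**2" standing in for
# products of momenta and couplings.
a = {"x**2": 3, "x*y": -1, "y**2": 2}
b = {"x**2": -3, "x*y": 4}
print(add_expressions(a, b))  # the x**2 terms cancel
```

A real Feynman-diagram expression has millions of such terms, which is why term-by-term streaming systems like FORM exist.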
The Download: Tweaking AI for energy efficiency, and China's leaked data
What's the news? Deep learning is behind machine learning's most high-profile successes, but this incredible performance comes at a cost: training deep-learning models requires huge amounts of energy. Now, new research shows how scientists who use cloud platforms to train algorithms can dramatically reduce the energy they use, and therefore the emissions they create. How can they do it? Simple changes to cloud settings are the key. Researchers created a tool that measures the electricity usage of any machine-learning program that runs on Azure, Microsoft's cloud service, during every phase of the project.
- Asia > China (0.40)
- North America > United States > New York > New York County > New York City (0.05)
- Europe (0.05)
- Asia > Sri Lanka (0.05)
- Energy > Renewable (0.75)
- Law (0.74)
- Information Technology > Services (0.56)
- Government (0.52)
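The emissions arithmetic behind this kind of research is simple: energy drawn, times the carbon intensity of the grid supplying it. A back-of-the-envelope sketch, where all power and intensity figures are made up for illustration (the Azure tool measures real usage):

```python
def training_emissions_kg(gpu_power_watts, hours, num_gpus, grid_kgco2_per_kwh):
    """Rough CO2 estimate for a training run: power x time x carbon intensity.
    Shifting the same job to a lower-carbon region or hour changes only the
    last factor, which is the kind of simple cloud-setting tweak the
    research describes."""
    energy_kwh = gpu_power_watts * num_gpus * hours / 1000.0
    return energy_kwh * grid_kgco2_per_kwh

# The same hypothetical job under two hypothetical grid intensities:
print(training_emissions_kg(300, 24, 8, 0.60))  # carbon-heavy grid
print(training_emissions_kg(300, 24, 8, 0.20))  # cleaner grid
```

Same training run, roughly a third of the emissions, just from where and when it executes.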
Scientists make first detection of exotic "X" particles in quark-gluon plasma
In the first millionths of a second after the Big Bang, the universe was a roiling, trillion-degree plasma of quarks and gluons -- elementary particles that briefly glommed together in countless combinations before cooling and settling into more stable configurations to make the neutrons and protons of ordinary matter. In the chaos before cooling, a fraction of these quarks and gluons collided randomly to form short-lived "X" particles, so named for their mysterious, unknown structures. Today, X particles are extremely rare, though physicists have theorized that they may be created in particle accelerators through quark coalescence, where high-energy collisions can generate similar flashes of quark-gluon plasma. Now physicists at MIT's Laboratory for Nuclear Science and elsewhere have found evidence of X particles in the quark-gluon plasma produced in the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, based near Geneva, Switzerland. The team used machine-learning techniques to sift through more than 13 billion heavy ion collisions, each of which produced tens of thousands of charged particles.
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.40)
- Europe > Switzerland > Geneva > Geneva (0.25)
- Asia > Japan (0.05)
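As a rough picture of what "sifting" billions of collisions means: the analysis hunts for a rare signature, here the X(3872) at a mass near 3.872 GeV, among an enormous background. The team used machine-learning models trained on many event features; the cut-based stand-in below, with illustrative event records and mass window, only sketches the selection step:

```python
def select_candidates(events, mass_window=(3.85, 3.90)):
    """Toy event filter: keep collisions whose reconstructed mass falls in a
    window around the X(3872) mass (~3.872 GeV). A real analysis replaces
    this single cut with a trained classifier over many features."""
    lo, hi = mass_window
    return [e for e in events if lo <= e["mass"] <= hi]

# Three hypothetical reconstructed candidates (masses in GeV):
events = [{"id": 1, "mass": 3.10}, {"id": 2, "mass": 3.87}, {"id": 3, "mass": 9.46}]
print(select_candidates(events))  # only event 2 survives the window
```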
CERN is making the Large Hadron Collider's data more accessible
The European Organization for Nuclear Research (CERN) will open up access to more data from Large Hadron Collider (LHC) experiments. Under an updated policy, data will be released around five years after it's collected and CERN hopes to release the full dataset publicly "by the close of the experiment concerned." Core LHC collaborators ALICE, ATLAS, CMS and LHCb all endorsed the move. CERN will make level 3 data available, which will allow anyone to conduct "high-quality analysis" on information obtained from Large Hadron Collider experiments. Level 3 relates to "calibrated reconstructed data with the level of detail useful for algorithmic, performance and physics studies," according to CERN.
Internet2 Taps Two Research Teams for Final Phase of E-CAS Project
WASHINGTON, D.C., Sept. 3, 2020 – Internet2 confirmed the selection of two research teams using an external academic review panel for the second and final phase of the Exploring Clouds for Acceleration of Science (E-CAS) project that was first announced in November 2018. The second phase of the E-CAS project builds on lessons learned and leading practices that have been identified by the six research proposals that were selected in March 2019 with the goal of producing a deeper understanding of the use of cloud computing in accelerating scientific discoveries. "The first phase of the E-CAS project supported six teams to develop their computational workflows and test them at scale, and the results from all teams were very impressive," said Howard Pfeffer, president and CEO, Internet2. "Now in the second phase, the teams from MIT and SUNY Downstate have the opportunity to build on their technological achievements using the commercial cloud platforms with a focus on the scientific outcomes of their work." The research team from MIT has developed a range of new tools and code to take advantage of the newest graphical processing units (GPUs) and field programmable gate arrays (FPGAs) to perform accelerated machine learning tasks in Amazon Web Services (AWS) and Google Cloud using remote procedure calls from their main workflows running on high-performance clusters at MIT and Fermilab.
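The remote-procedure-call pattern mentioned here, a main workflow on a cluster calling out to accelerated models running elsewhere, can be sketched with Python's standard-library XML-RPC. In this minimal sketch the "cloud service" is a local server and the "model" is a toy scoring function; none of this is the MIT team's actual code:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def classify_event(features):
    # Stand-in for a GPU- or FPGA-accelerated model: score is just a sum.
    return sum(features)

# "Cloud service": an XML-RPC server on a free local port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(classify_event, "classify_event")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# "Main workflow on the cluster": call the remote model over RPC.
client = ServerProxy(f"http://127.0.0.1:{port}")
score = client.classify_event([1, 2, 3])
print(score)
server.shutdown()
```

The appeal of the pattern is that the expensive accelerator pool scales independently of the HPC workflow that feeds it.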
Berkeley Lab Cosmologists Are Top Contenders in Machine Learning Challenge
The 2020 LHC Olympics challenged teams to develop a machine learning code to find a hidden signal in particle-collision data.
Image: particle-collision data captured by the ATLAS detector at CERN's Large Hadron Collider.
In searching for new particles, physicists can lean on theoretical predictions that suggest some good places to look and some good ways to find them: It's like being handed a rough sketch of a needle hidden in a haystack. But blind searches are a lot more complicated, like hunting in a haystack without knowing what you are looking for. To find what conventional computer algorithms and scientists may overlook in the huge volume of data collected in particle collider experiments, the particle physics community is turning to machine learning, an application of artificial intelligence that can teach itself to improve its searching skills as it sifts through a haystack of data.
- North America > United States > New York (0.05)
- Europe > Germany (0.05)
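A minimal sketch of the blind-search idea, assuming nothing about the signal: score each event by how far it sits from the bulk of the data. Real LHC Olympics entries used far richer unsupervised models (autoencoders, density estimators), and the numbers below are toy values:

```python
import statistics

def anomaly_scores(values):
    """Toy unsupervised anomaly detector: score each event by how many
    standard deviations it sits from the sample mean. The principle matches
    a blind search -- flag events the bulk of the data cannot explain,
    without knowing in advance what the signal looks like."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [abs(v - mu) / sigma for v in values]

masses = [1.0, 1.1, 0.9, 1.05, 0.95, 5.0]  # one outlier "event"
scores = anomaly_scores(masses)
print(scores.index(max(scores)))  # index of the most anomalous event
```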
A glimpse into the future: accelerated computing for accelerated particles
Every proton collision at the Large Hadron Collider is different, but only a few are special. The special collisions generate particles in unusual patterns -- possible manifestations of new, rule-breaking physics -- or help fill in our incomplete picture of the universe. Finding these collisions is harder than the proverbial search for the needle in the haystack. But game-changing help is on the way. Fermilab scientists and other collaborators successfully tested a prototype machine-learning technology that speeds up processing by 30 to 175 times compared to traditional methods.
- North America > United States > Virginia (0.05)
- North America > United States > Illinois > Cook County > Chicago (0.05)
Google Street View Now Lets You Take a Walk Around the Large Hadron Collider
Ever wanted to take a peek inside an underground particle accelerator? Want your favorite British actor to walk you through the origin of the universe? While you can't stick your head into the Large Hadron Collider, you can now go for a short walk around it -- and explore other scientific marvels, thanks to Google's new online invention exhibition project, part of its Arts & Culture platform. With AR apps, AI-powered image galleries, and first-person views of underground science facilities, you might encounter more than a few surprising origin stories concerning mankind's most ambitious discoveries. The star here is Google's new Street View-powered tour of the Large Hadron Collider (LHC), the famous CERN-run particle accelerator.
- Europe > Switzerland (0.06)
- Europe > France (0.06)
Physicists Develop New Techniques to Enhance Data Analysis for Large Hadron Collider
New York University (NYU) researchers have developed machine learning techniques that can significantly improve data analysis for the Large Hadron Collider (LHC), the world's most powerful particle accelerator. The researchers had previously developed statistical tools and methodology to perform measurements of the Higgs boson, and the new methods offer the possibility of additional breakthrough discoveries. NYU researcher Kyle Cranmer says simulations often provide the best descriptions of a complicated phenomenon, but they are difficult to use in the context of data analysis.
- North America > United States > New York (0.60)
- North America > United States > Maryland > Montgomery County > Bethesda (0.09)
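One standard way to analyze data when only a simulator is available, the difficulty described above, is Approximate Bayesian Computation: draw parameters from a prior, run the simulator, and keep the draws whose output resembles the observation. Cranmer's own methods are more sophisticated (e.g. learned likelihood ratios), so this is only a minimal illustrative sketch with a toy simulator:

```python
import random

def abc_posterior(observed_mean, simulate, prior_draws=20000, tol=0.05, seed=0):
    """Minimal Approximate Bayesian Computation: when a simulator can
    generate data but no tractable likelihood exists, keep the parameter
    draws whose simulated summary lands within tol of the observed one."""
    rng = random.Random(seed)
    kept = []
    for _ in range(prior_draws):
        theta = rng.uniform(0.0, 2.0)  # flat prior over the parameter
        if abs(simulate(theta, rng) - observed_mean) < tol:
            kept.append(theta)
    return kept

def simulate(theta, rng):
    # Toy "detector simulation": mean of 50 noisy samples around theta.
    return sum(rng.gauss(theta, 0.5) for _ in range(50)) / 50

accepted = abc_posterior(1.0, simulate)
print(sum(accepted) / len(accepted))  # posterior mean, close to 1.0
```

The accepted draws approximate the posterior over the parameter without ever writing down a likelihood, which is exactly the regime where only simulations describe the phenomenon well.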